Patent abstract:
The invention relates to a method (300) for validating the use of a real finger as support for a fingerprint, said validation method (300) comprising: - a positioning step (302) during which the support is placed against a capture surface, - a capture step (304) during which a so-called captured image of the fingerprint is captured, - a filtering step (306) during which the captured image is transformed into a resulting image by passing through a low-pass filter, - a location step (308) during which an origin point O, whose pixel intensity is representative of the maximum pressure exerted on the support, is located on the resulting image, - a verification step (310) during which it is verified that, on the resulting image, for a plurality of rays originating from the origin point O, and for each of said rays, for a plurality of points M, the pixel intensity of each point M of said ray is representative of a decrease in the pressure exerted on the support as the distance from the origin point O to the point M increases, and - a decision-making step (312) during which a decision as to the validity of the support is made based on the results of the verification step (310).
Publication number: FR3017230A1
Application number: FR1450834
Filing date: 2014-02-04
Publication date: 2015-08-07
Inventors: Alain Thiebot; Benoit Thouy; Jean-Francois Boulanger; Julien Doublet
Applicant: Morpho SA
Primary IPC:
Patent description:

[0001] The present invention relates to a method for validating the use of a real finger as support for a fingerprint, and to a validation device implementing such a method. A device for identifying an individual by his fingerprint consists of a sensor, a comparison means and a decision-making means. The sensor has a capture surface on which the finger rests and through which an image of the fingerprint of the finger is captured. The comparison means compares the captured image, or the biometric templates derived from it, with the biometric images or templates of a database which gathers the images or templates of persons previously registered in the identification device. The decision-making means is intended to make a decision as to the identification of the individual from the result of the comparisons. Several technologies commonly exist in the field of fingerprint sensors for capturing images of a finger in contact with an acquisition surface, in particular optical, capacitive, electric-field, thermal, ultrasonic or pressure-based measurements. Some malicious people try to be fraudulently identified by using decoys to mislead the identification device. Various validation methods are known for validating the fact that the finger carrying the fingerprint is a real finger. In particular, it is known to use the deformation of the finger on the sensor to check whether it corresponds to skin, whose elastic characteristics are different from those of the materials used to make the lures. In particular, it is known to turn the finger on the capture surface in order to induce an image distortion which makes it possible to analyze the elasticity of the skin or of the support material of the fraud. But such a method is not very ergonomic, because the movement must be explained to the individual wanting to be identified, which is not possible, for example, in the case of an identification device which is not supervised.
An object of the present invention is to propose a validation method making it possible to validate the use of a real finger as carrier of a fingerprint, which does not have the disadvantages of the state of the art and which, in particular, offers great ergonomics to the individual.
[0002] For this purpose, there is provided a method of validating the use of a real finger as support for a fingerprint, implemented by a validation device comprising a capture surface on which said support bears, a sensor for capturing an image of the fingerprint, a processing module and a decision-making module, said validation method comprising: - a positioning step during which the support is placed against the capture surface, - a capture step during which the sensor captures a so-called captured image of the fingerprint, - a filtering step during which the processing module transforms the captured image into a resulting image by passing through a low-pass filter whose cutoff frequency is much lower than the ridge frequency of a fingerprint, - a location step during which the processing module locates, on the resulting image, an origin point O whose pixel intensity is representative of the maximum pressure exerted on the support, - a verification step during which the processing module verifies that, on the resulting image, for a plurality of rays originating from the origin point O, and for each of said rays, for a plurality of points M, the pixel intensity of each point M of said ray is representative of a decrease in the pressure exerted on the support as the distance from the origin point O to the point M increases, and - a decision-making step during which the decision-making module makes a decision as to the validity of the support based on the results of the verification step. Advantageously, said cutoff frequency is of the order of 0.1 to 1 cycle per mm. Advantageously, the verification step consists in verifying that, from the origin point O and on each ray coming from the origin point O, the intensity gradient at each point M is negative.
Advantageously, the decision-making step is based on a cumulative criterion noted Dism(P, O), representing the deviation of the actual intensity profile P from the theoretical intensity model centered at O of a real finger, and on a comparison of this deviation Dism(P, O) with a threshold. Advantageously, the deviation Dism(P, O) is given by the formula:

Dism(P, O) = (1 / Area(P)) ∫_0^{2π} ∫_0^{R_max} L(∇_O P(r, θ)) dr dθ   (3)

where: - ∇_O P(r, θ) is the projection, on the radial local basis at M relative to O, of the intensity gradient at the point M of the resulting image, of polar coordinates (r, θ), - L is a function that is zero on ℝ⁻ and increasing on ℝ⁺*, - R_max is the maximum distance between the origin point O and any point M of the resulting image, and - Area(P) is the area of the region considered around the point M. Advantageously, the validation method comprises, between the capture step and the filtering step, a preprocessing step intended to improve the rendering of the resulting image. According to a particular embodiment, the preprocessing step consists in applying to the pixels (x, y) of the captured image a function F(x, y) defined by the formula:

F(x, y) = (255 − p(x, y)) · Rr(x, y) / Rir(x, y)   (7)

where, when the pixel (x, y) does not belong to the ridge skeleton, Rr(x, y) is zero, and when the pixel (x, y) belongs to the ridge skeleton, Rr(x, y) is the local width of the ridge at this pixel (x, y); and where, when the pixel (x, y) does not belong to the skeleton of the valleys, Rir(x, y) is zero, and when the pixel (x, y) belongs to the skeleton of the valleys, Rir(x, y) is the local width of the valley at this pixel (x, y). In another particular embodiment, the preprocessing step consists in applying to the pixels (x, y) of the captured image a function F(x, y) defined by the formula:

F(x, y) = (255 − p(x, y)) · Rr(x, y) / Rir(x, y)   (7)

where Rr(x, y) is the number of ridge pixels in an area centered on the pixel (x, y), and where Rir(x, y) is the number of valley pixels in an area centered on the pixel (x, y). Advantageously, the location step consists in choosing the origin point O as the point of the resulting image having the highest intensity. The invention also proposes a validation device comprising: - a capture surface on which said support bears, - a sensor intended to capture an image of the fingerprint, - a processing module comprising: filtering means intended to transform the captured image into a resulting image by passing through a low-pass filter whose cutoff frequency is much lower than the ridge frequency of a fingerprint, location means for locating, on the resulting image, an origin point O whose pixel intensity is representative of the maximum pressure exerted on the support, and verification means for verifying that, on the resulting image, for a plurality of rays originating from the origin point O, and for each of said rays, for a plurality of points M, the pixel intensity of each point M of said ray is representative of a decrease in the pressure exerted on the support as the distance from the origin point O to the point M increases, and - a decision-making module intended to make a decision as to the validity of the support based on the results transmitted by the verification means. The characteristics of the invention mentioned above, as well as others, will appear more clearly on reading the following description of an exemplary embodiment, said description being made in connection with the attached drawings, among which: FIG. 1 is a schematic representation of a validation device according to the invention, FIG. 2 is an image of a fingerprint captured by a validation device according to the invention, FIG. 3 is an algorithm of a validation method according to the invention, FIG. 4 is an image of the fingerprint of FIG. 2 as modified during the validation method, and FIG. 5 shows a diagram serving as a support for calculation. A real finger has a particular elasticity: when it is pressed against a capture surface, it presents a homogeneous deformation which consists of a zone of maximum pressure and a decrease of the pressure as one moves away from this zone. The uniqueness of this zone comes from the fact that the finger can only exert a pressure that is constant over the finger as a whole, and the cylindrical shape of the finger makes the pressure decrease when approaching the edges of the finger.
[0003] When a finger is covered with a lure, or the finger is false, the elasticity of the lure is different from that of a real finger, and when the lure is pressed against a capture surface, it does not deform homogeneously: there are then several areas of high pressure separated by areas of low pressure. The principle of the invention therefore consists in looking for an origin point of high pressure, and in verifying that, from this origin point, the pressure decreases. Fig. 1 shows a validation device 100 which is intended to validate the use of a real finger as carrier of a fingerprint.
[0004] The validation device 100 comprises: - a capture surface 102 on which a support 10 carrying a fingerprint is placed in abutment, - a sensor 104 intended to capture the image of the fingerprint through the capture surface 102, - a processing module 106 intended to receive the image of the fingerprint captured by the sensor 104 and to process it as described below, and - a decision-making module 108 for deciding whether the support 10 is a real finger or a false finger, from the information transmitted by the processing module 106.
[0005] Fig. 2 is a captured image 200 of a fingerprint as captured by the sensor 104 and transmitted to the processing module 106. Conventionally, the print has ridges and valleys. Here, the captured image 200 is in gray level and the ridges appear in black on a white background. Fig. 4 shows an image 400 which results from passing the captured image 200 through a filtering step (306, Fig. 3). The ridges and valleys are no longer discernible. The image 400 is representative of the pressures exerted at all points of the support 10 and is hereinafter referred to as the resulting image 400. In the embodiment of the invention presented here, the brighter an area of the resulting image 400, the greater the pressure in this area.
[0006] Fig. 3 is an algorithm of a validation method 300 implemented by the validation device 100. The validation method 300 comprises: - a positioning step 302 during which the support 10 carrying the print is placed in abutment against the capture surface 102, - a capture step 304 during which the sensor 104 captures the captured image 200 of the fingerprint, - a filtering step 306 during which the processing module 106 transforms the captured image 200 into the resulting image 400 by passing through a low-pass filter whose cutoff frequency is much lower than the ridge frequency of a fingerprint, - a locating step 308 during which the processing module 106 locates, on the resulting image 400, an origin point O whose pixel intensity is representative of the maximum pressure exerted on the support 10, - a verification step 310 during which the processing module 106 verifies that, on the resulting image 400, for a plurality of rays 402 originating from the origin point O, and for each of said rays 402, for a plurality of points M, the pixel intensity of each point M of said ray 402 is representative of a decrease of the pressure exerted on the support 10 as the distance from the origin point O to the point M increases, and - a decision-making step 312 during which the decision-making module 108 makes a decision as to the validity of the support 10 according to the results of the verification step 310. The validation method 300 thus allows a use without constraint for the person presenting his finger 10. The low-pass filter will be, for example, a Gaussian filter, a median filter, an averaging filter or any other filter making it possible to retain only the low-frequency information of the captured image 200. The locating step 308 consists in choosing the origin point O as the point of the resulting image 400 having the highest intensity.
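The filtering step 306 and the locating step 308 can be sketched as follows. This is an illustrative sketch in Python, not part of the patent: it uses a simple averaging (box) filter as the low-pass filter, assumes the captured image is a grayscale array with dark ridges on a white background, and the kernel size `k` is a hypothetical parameter standing in for the cutoff frequency.

```python
import numpy as np

def box_blur(img, k):
    """Separable k x k averaging (box) filter with edge padding; any
    low-pass filter (Gaussian, median, ...) could be used instead.
    k should be odd so the output keeps the input shape."""
    kernel = np.ones(k) / k
    pad = k // 2
    out = np.pad(img, pad, mode="edge")
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, out)
    return out

def filter_and_locate(captured, k=31):
    """Steps 306 and 308: low-pass the captured image, then locate the
    origin point O as the brightest pixel of the resulting image."""
    # Invert so that higher intensity = higher pressure (ridges are dark
    # on a white background in the captured image 200).
    pressure = 255.0 - np.asarray(captured, dtype=float)
    resulting = box_blur(pressure, k)
    oy, ox = np.unravel_index(np.argmax(resulting), resulting.shape)
    return resulting, (int(ox), int(oy))
```

In practice the kernel size would be chosen from the sensor resolution so that the filter's cutoff lands well below the ridge frequency of the print.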
The processing module 106 comprises: - filtering means for transforming the captured image 200 into the resulting image 400 by passing through a low-pass filter whose cutoff frequency is much lower than the ridge frequency of a fingerprint, - locating means intended to locate, on the resulting image 400, an origin point O whose pixel intensity is representative of the maximum pressure exerted on the support 10, and - verification means for verifying that, on the resulting image 400, for a plurality of rays 402 originating from the origin point O, and for each of said rays 402, for a plurality of points M, the pixel intensity of each point M of said ray 402 is representative of a decrease in the pressure exerted on the support 10 as the distance from the origin point O to the point M increases. The decision-making module 108 is intended to make a decision as to the validity of the support 10 as a function of the results transmitted by the verification means. A fingerprint conventionally has a ridge frequency of the order of 1.5 to 3 cycles per mm, and to obtain an exploitable image after the filtering step 306, the cutoff frequency applied during this filtering step 306 is of the order of 0.1 to 1 cycle per mm, and more particularly 0.5 cycle per mm. In the resulting image 400, the intensity of each pixel is representative of the pressure exerted on the point of the support 10 whose image is said pixel. According to a particular embodiment of the invention, the verification step 310 consists in using the intensity gradients of the pixels within the resulting image 400 and in verifying that, from the origin point O and on each ray 402 coming from the origin point O, the intensity gradient at each point M, oriented in the direction OM, is negative. If the gradient is positive moving away from the origin point O, this is an indication that the support 10 may be a fake finger. The intensity gradient corresponds to the gradient of the pressure exerted on the support 10.
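The quantity checked in step 310, the intensity gradient projected on the radial direction away from O, can be sketched as follows. Illustrative only: `np.gradient` (a centered finite difference) is an assumed choice of gradient estimator, not specified by the patent.

```python
import numpy as np

def radial_gradient(resulting, origin):
    """Project the local intensity gradient at each pixel onto the unit
    vector pointing away from the origin point O (the r-hat vector of the
    radial local basis). Negative values mean that the intensity, i.e. the
    pressure, decreases moving away from O, as expected for a real finger."""
    gy, gx = np.gradient(np.asarray(resulting, dtype=float))
    ox, oy = origin
    ys, xs = np.indices(resulting.shape)
    dx, dy = xs - ox, ys - oy
    norm = np.hypot(dx, dy)
    norm[norm == 0] = 1.0  # avoid 0/0 at O itself, where the projection is 0
    return (gx * dx + gy * dy) / norm
```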
[0007] Since it may happen that, even with a real finger, the gradient rises slightly on a portion of a ray 402, for example because of a particularity of the finger (a scar, ...) or of parasitic lighting, the decision-making step 312 is preferably based on a cumulative criterion noted Dism(P, O), representing the deviation of the actual intensity profile P from the theoretical intensity model centered at O of a real finger, and on a comparison of this deviation with a threshold. If the deviation Dism(P, O) is greater than the threshold, the support 10 is considered as a false finger, and if the deviation Dism(P, O) is below the threshold, the support 10 is considered as a true finger. The result of the comparison between the threshold and the deviation Dism(P, O) serves as a basis for the decision-making step 312. One method consists in working in a radial local basis at each point M of the resulting image 400. The radial local basis at M relative to O is the basis (r̂, θ̂) such that r̂ = OM / ‖OM‖ and θ̂ is the unit vector orthogonal to r̂ such that (O, r̂, θ̂) is a direct frame.

[0008] M is a point of the resulting image 400 and we denote ∇P(M) the intensity gradient at M. This gradient, expressed in the image reference frame, can be projected on the radial local basis at M relative to O, and its projection on the vector r̂ is written ∇_O P(M).
[0009] In the case of a real finger, that is to say in the case of an ideal intensity profile, the intensity profile normally comprises a unique local maximum, noted O, and any local gradient, projected on the radial local basis at M relative to O, is negative and thus fulfills the equation:

∀M, ∇_O P(M) < 0   (1)

where P(M) is the intensity at the point M. ∇_O P(M) is thus the projection, on the radial local basis at M relative to O, of the intensity gradient at the point M of the resulting image 400. By considering polar coordinates of center O, the coordinates of the point M in this reference frame are (r, θ) and equation (1) is written:

∀M(r, θ), r > 0, θ ∈ [0, 2π], ∇_O P(r, θ) < 0   (2)

This corresponds to the fact that the intensity profile along any ray 402 starting from the origin point O is decreasing. We choose a real function L that is zero on ℝ⁻ and increasing on ℝ⁺*. For example, one can choose the function L such that:

L(x) = x if x > 0, L(x) = 0 if x ≤ 0

L is then a function selecting the positive gradients, but another function could make it possible to weight the gradients according to their intensity. The deviation Dism(P, O) is then given by the formula:

Dism(P, O) = (1 / Area(P)) ∫_0^{2π} ∫_0^{R_max} L(∇_O P(r, θ)) dr dθ   (3)

where R_max is the maximum distance between the origin point O and any point M of the resulting image 400 and where Area(P) is the area of the region considered around the point M, here expressed in pixels. Several methods can be used to calculate the deviation Dism(P, O). Each calculation method offers a compromise between the speed and the accuracy of the calculation. In other words, it is possible to choose all the points M to obtain a very precise value of the deviation Dism(P, O), but in this case the calculation is long, or it is possible to limit the number of points M to obtain a fast calculation, but to the detriment of the precision of the computation.
To calculate the integral exactly, for each pixel M of the resulting image 400, the local intensity gradient is calculated and projected on the radial basis relative to the origin point O. The sum of all the local projections ∇_O P(r, θ) of the gradients which are strictly positive is then carried out. In Cartesian coordinates, and taking a width W and a height H for the resulting image 400, formula (3) is written:

Dism(P, O) = (1 / Area(P)) Σ_{x=0}^{W} Σ_{y=0}^{H} Area(M(x, y)) · L(∇_O P(x, y))   (4)

where L is the function defined above, which keeps only the positive values.
[0010] This amounts to calculating the local gradients over the resulting image 400 and summing the projections on the local radial bases which are positive. Area(M(x, y)) is the area of each zone centered on the point M(x, y) over which the gradient is calculated. Here, this area is equal to one pixel, but it is possible to subsample the calculation so as not to consider all the pixels and speed up the calculation, in which case the area is greater than 1. Another method is to sample the angles of integration and calculate the gradients only along the chosen rays. Along each ray, the local gradients are calculated with an interpolation method, and then the sum of the positive gradients is calculated. Each gradient can be weighted by the area of the crown sector it represents. For example, in the case where the gradient calculation points are sampled uniformly with a pitch of 1 pixel, and where Θ uniformly sampled angles are selected over the interval [0, 2π], Θ being generally 8 or 16, formula (3) is approximated by the formula:

Dism(P, O) ≈ (1 / Area(P)) Σ_{i=0}^{Θ−1} Σ_{j=1}^{R_max} Area(R_j) · L(∇_O P(j, θ_i))   (5)

and, as the area of a crown sector of mean radius j, of thickness 1 and of angle 2π/Θ is approximately (2π/Θ) · j, formula (5) becomes:

Dism(P, O) ≈ (2π / (Θ · Area(P))) Σ_{i=0}^{Θ−1} Σ_{j=1}^{R_max} j · L(∇_O P(j, θ_i))   (6)

The deviation Dism(P, O) is not bounded and represents the divergences from the theoretical model; it is then possible to define a threshold for the deviation Dism(P, O) beyond which the decision-making module 108 will consider the support 10 as a false finger and below which it will consider the support 10 as a real finger. This threshold can be determined from a representative base of real fingers or from a base of real and false fingers; for example, a neural network or an SVM may be used. The decision threshold is defined from a measurement defined on a representative base of real fingers.
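The exact Cartesian evaluation, in the spirit of formula (4), can be sketched as follows. This is a sketch under assumptions: the whole image is taken as the region P for the normalization, `np.gradient` is the assumed gradient estimator, and `step` subsamples pixels as the description allows.

```python
import numpy as np

def dism(resulting, origin, step=1):
    """Deviation Dism(P, O) in the spirit of formula (4): sum of the
    positive radial gradient projections L(grad), normalised by the image
    area. step > 1 subsamples the pixels to trade accuracy for speed, as
    the description allows (the sampled area then exceeds one pixel)."""
    P = np.asarray(resulting, dtype=float)
    gy, gx = np.gradient(P)
    ox, oy = origin
    ys, xs = np.indices(P.shape)
    dx, dy = xs - ox, ys - oy
    norm = np.hypot(dx, dy)
    norm[norm == 0] = 1.0
    proj = (gx * dx + gy * dy) / norm           # grad_O P at each pixel
    positive = np.where(proj > 0.0, proj, 0.0)  # L keeps positive gradients
    sub = positive[::step, ::step]
    return (step * step) * sub.sum() / P.size   # Area(M) * sum / Area(P)
```

For an ideal single-maximum profile all projections are non-positive and the deviation is zero; a second pressure bump away from O produces positive radial gradients and a strictly positive deviation, which is then compared with the decision threshold.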
Between the capture step 304 and the filtering step 306, a preprocessing step 305 may be set up to improve the rendering of the resulting image 400.
[0011] The processing module 106 then comprises preprocessing means intended to implement the preprocessing step 305. From the captured image 200, the processing module 106 determines an image S representative of a skeleton of the ridges and of a skeleton of the valleys (inter-ridges). Such a determination is described, for example, in [Alessandro Farina, Zsolt M. Kovacs-Vajna, Alberto Leone, "Fingerprint Minutiae Extraction from Skeletonized Binary Images", Pattern Recognition, Vol. 32, pp. 877-889, 1999]. At each point of the ridge skeleton, the processing module 106 calculates the local width of the ridge, and at each point of the valley skeleton, the processing module 106 calculates the local width of the valley. Such calculations are illustrated in Fig. 5, which shows skeletons of ridges 502 and skeletons of valleys 504 in grayscale. For each point P of the ridge skeleton 502, the width "d" of the corresponding ridge is measured along the normal to the skeleton, d1 and d2 being the widths of the valleys.
[0012] The processing module 106 then builds a matrix of ridges, noted Rr, and a matrix of valleys, noted Rir. The matrix Rr and the matrix Rir have the same size as the captured image 200, and each coefficient at row x and column y corresponds to the pixel at row x and column y of said captured image 200, denoted (x, y). For each pixel (x, y) not belonging to the ridge skeleton, the corresponding coefficient Rr(x, y) of the ridge matrix Rr is zero, and for each pixel (x, y) belonging to the ridge skeleton, the corresponding coefficient Rr(x, y) of the ridge matrix Rr is the local width of the ridge at this pixel (x, y). For each pixel (x, y) not belonging to the skeleton of the valleys, the corresponding coefficient Rir(x, y) of the valley matrix Rir is zero, and for each pixel (x, y) belonging to the skeleton of the valleys, the corresponding coefficient Rir(x, y) of the valley matrix Rir is the local width of the valley at this pixel (x, y). Another method of defining Rr(x, y) and Rir(x, y) is to use the density of ridges (or valleys) around the pixel (x, y): Rr(x, y) can thus be defined as the number of ridge pixels in an area centered on the pixel (x, y), while Rir(x, y) can be defined as the number of valley pixels in an area centered on the pixel (x, y). Rr and Rir are thus no longer defined as distances, as in the first method described, but by a notion of density. The function F is a function that transforms the captured image 200 into a preprocessed image, denoted IP, which is a monotonic function of the intensity of the pixels (x, y) of the captured image 200 and is such that for any pixel (x, y) of the captured image 200 belonging to one of the two skeletons, the pixel (x, y) of the preprocessed image IP takes a positive intensity value, and for any pixel (x, y) of the captured image 200 not belonging to one of the two skeletons, the pixel (x, y) of the preprocessed image IP takes a zero intensity value.
For example, it is possible to take the function F(x, y) defined by the formula:

F(x, y) = (255 − p(x, y)) · Rr(x, y) / Rir(x, y)   (7)

The preprocessed image IP can then undergo the filtering step 306. It is also possible to perform the preprocessing step 305 and the filtering step 306 simultaneously. For example, it is possible to use the function F(x, y) defined by the formula:

F(x, y) = G_Σ * (α · (255 − p(x, y)) + β · Rr(x, y) / Rir(x, y)), with α + β = 1   (8)

That is, the function F is the convolution, by a Gaussian of covariance matrix Σ, of the weighted sum of the captured image 200 and of the image of the ratio of the ridge widths to the valley widths. In practice, α is chosen between 0 and 0.5. The image so obtained can then be directly assimilated to the resulting image 400. It may happen that the resulting image 400 has a plurality of points, each of which has a pixel intensity representative of a local maximum pressure; the resulting image 400 then has a plurality of origin points O.
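The density variant of formula (7) can be sketched as follows. This is an illustrative sketch assuming binary ridge and valley masks are already available (deriving the skeletons, as in the Farina et al. reference cited above, is outside its scope); the window half-size `w` is a hypothetical parameter.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_count(mask, w):
    """Count True pixels in a (2w+1) x (2w+1) window centred on each pixel."""
    padded = np.pad(mask.astype(np.int64), w, mode="edge")
    windows = sliding_window_view(padded, (2 * w + 1, 2 * w + 1))
    return windows.sum(axis=(2, 3))

def preprocess(captured, ridge_mask, valley_mask, w=2):
    """F(x, y) = (255 - p(x, y)) * Rr(x, y) / Rir(x, y), formula (7),
    with Rr and Rir taken as local ridge/valley pixel densities."""
    p = np.asarray(captured, dtype=float)
    Rr = box_count(ridge_mask, w)
    Rir = box_count(valley_mask, w)
    # Guard against division by zero where no valley pixel is in the window.
    return (255.0 - p) * Rr / np.maximum(Rir, 1)
```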
[0013] The notion of local maximum pressure around each origin point O is then used. The locality of the search zone for an origin point O can be defined, for example, by the minimum distance between this origin point and all the other origin points. This distance can be adjusted by a coefficient so that the zone is far from the other origins. A practical value of this coefficient is 2.
[0014] In the case where several origin points O are defined, the deviation can be defined by performing the deviation calculation several times, once for each origin point O. The consolidation is done by analysis of the different results on each origin point O (for example min, max, average, ...). The final decision is made by thresholding the deviation defined after this consolidation. Of course, the present invention is not limited to the examples and embodiments described and shown, but it is capable of many variants accessible to those skilled in the art.
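The multiple-origin case of paragraphs [0013] and [0014] can be sketched as follows. Illustrative only: `find_origins` performs a greedy non-maximum suppression in which the relative threshold `rel_thresh` is an assumed parameter (the patent defines locality by a minimum distance and a coefficient, not by an intensity threshold), and the deviation computation is passed in as a parameter `dism_fn` rather than fixed.

```python
import numpy as np

def find_origins(resulting, min_distance=15, rel_thresh=0.9):
    """Candidate origin points O: bright pixels, kept only if far enough
    (min_distance, cf. the locality coefficient of [0013]) from any
    stronger candidate already kept."""
    P = np.asarray(resulting, dtype=float)
    order = np.argsort(P, axis=None)[::-1]  # brightest pixels first
    cutoff = P.max() * rel_thresh
    points = []
    for idx in order:
        y, x = np.unravel_index(idx, P.shape)
        if P[y, x] < cutoff:
            break
        if all((y - py) ** 2 + (x - px) ** 2 >= min_distance ** 2
               for py, px in points):
            points.append((int(y), int(x)))
    return points

def consolidated_decision(resulting, dism_fn, threshold, min_distance=15):
    """Compute the deviation once per origin point, consolidate (here with
    max; min or average are also mentioned in [0014]), and threshold the
    consolidated value. Returns True when the support is considered fake."""
    origins = find_origins(resulting, min_distance)
    scores = [dism_fn(resulting, o) for o in origins]
    return max(scores) > threshold
```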
Claims:
Claims (10)
[0001]
CLAIMS

1) Method for validating (300) the use of a real finger as support (10) of a fingerprint, implemented by a validation device (100) comprising a capture surface (102) on which said support (10) bears, a sensor (104) for capturing an image (200) of the fingerprint, a processing module (106) and a decision-making module (108), said validation method (300) comprising: - a positioning step (302) during which the support (10) is placed pressed against the capture surface (102), - a capture step (304) during which the sensor (104) captures a captured image (200) of the print, - a filtering step (306) during which the processing module (106) transforms the captured image (200) into a resulting image (400) by passing through a low-pass filter whose cutoff frequency is much lower than the ridge frequency of a fingerprint, - a location step (308) during which the processing module (106) locates, on the resulting image (400), an origin point O whose pixel intensity is representative of the maximum pressure exerted on the support (10), - a verification step (310) during which the processing module (106) verifies that, on the resulting image (400), for a plurality of rays (402) coming from the origin point O, and for each of said rays (402), for a plurality of points M, the pixel intensity of each point M of said ray (402) is representative of a decrease in the pressure exerted on the support (10) as the distance from the origin point O to the point M increases, and - a decision-making step (312) in which the decision-making module (108) makes a decision as to the validity of the support (10) according to the results of the verification step (310).
[0002]
2) Validation method (300) according to claim 1, characterized in that said cutoff frequency is of the order of 0.1 to 1 cycle per mm.
[0003]
3) Validation method (300) according to one of claims 1 or 2, characterized in that the verification step (310) consists in verifying that, from the origin point O and on each ray (402) coming from the origin point O, the intensity gradient at each point M is negative.
[0004]
4) Validation method (300) according to claim 3, characterized in that the decision-making step (312) is based on a cumulative criterion noted Dism(P, O), representing the deviation of the actual intensity profile P from the theoretical intensity model centered at O of a real finger, and on a comparison of this deviation Dism(P, O) with a threshold.
[0005]
5) Validation method (300) according to claim 4, characterized in that the deviation Dism(P, O) is given by the formula:

Dism(P, O) = (1 / Area(P)) ∫_0^{2π} ∫_0^{R_max} L(∇_O P(r, θ)) dr dθ   (3)

where: - ∇_O P(r, θ) is the projection, on the radial local basis at M relative to O, of the intensity gradient at the point M of the resulting image (400), of polar coordinates (r, θ), - L is a function that is zero on ℝ⁻ and increasing on ℝ⁺*, - R_max is the maximum distance between the origin point O and any point M of the resulting image (400), and - Area(P) is the area of the region considered around the point M.
[0006]
6) Validation method (300) according to one of claims 1 to 5, characterized in that it comprises a preprocessing step (305) for improving the rendering of the resulting image (400), between the capture step (304) and the filtering step (306).
[0007]
7) Validation method (300) according to claim 6, characterized in that the preprocessing step (305) consists in applying to the pixels (x, y) of the captured image (200) a function F(x, y) defined by the formula:

F(x, y) = (255 − p(x, y)) · Rr(x, y) / Rir(x, y)   (7)

where, when the pixel (x, y) does not belong to the ridge skeleton, Rr(x, y) is zero, and when the pixel (x, y) belongs to the ridge skeleton, Rr(x, y) is the local width of the ridge at this pixel (x, y); and where, when the pixel (x, y) does not belong to the skeleton of the valleys, Rir(x, y) is zero, and when the pixel (x, y) belongs to the skeleton of the valleys, Rir(x, y) is the local width of the valley at this pixel (x, y).
[0008]
8) Validation method (300) according to claim 6, characterized in that the preprocessing step (305) consists in applying to the pixels (x, y) of the captured image (200) a function F(x, y) defined by the formula:

F(x, y) = (255 − p(x, y)) · Rr(x, y) / Rir(x, y)   (7)

where Rr(x, y) is the number of ridge pixels in an area centered on the pixel (x, y), and where Rir(x, y) is the number of valley pixels in an area centered on the pixel (x, y).
[0009]
9) Validation method (300) according to one of claims 1 to 8, characterized in that the locating step (308) consists in choosing the origin point O as the point of the resulting image (400) having the highest intensity.
[0010]
10) Validation device (100) comprising: - a capture surface (102) on which said support (10) bears, - a sensor (104) for capturing an image (200) of the fingerprint, - a processing module (106) comprising: filtering means for transforming the captured image (200) into a resulting image (400) by passing through a low-pass filter whose cutoff frequency is much lower than the ridge frequency of a fingerprint, locating means intended to locate, on the resulting image (400), an origin point O whose pixel intensity is representative of the maximum pressure exerted on the support (10), and verification means for verifying that, on the resulting image (400), for a plurality of rays (402) coming from the origin point O, and for each of said rays (402), for a plurality of points M, the pixel intensity of each point M of said ray (402) is representative of a decrease in the pressure exerted on the support (10) as the distance from the origin point O to the point M increases, and - a decision-making module (108) for making a decision as to the validity of the support (10) as a function of the results transmitted by the verification means.
Similar documents:
Publication number | Publication date | Patent title
EP2902943B1|2016-10-12|Method for validating the use of a real finger as a support for a fingerprint
FR2884947A1|2006-10-27|Eye iris three-dimensional shape acquisition method for securing e.g. building, involves capturing image of iris by image sensor, analyzing hue of light intensities generated by light beam on iris and forming three-dimensional shape of iris
EP2901370B1|2016-07-06|Method for detecting a real face
EP2771841B1|2015-12-16|Anti-fraud device
FR2915301A1|2008-10-24|PROCESS FOR COMPARISON OF IMAGES OF A BIOMETRY BETWEEN A REFERENCE IMAGE AND AT LEAST ONE TEST IMAGE WHICH WE SEEK TO EVALUATE A DEGREE OF CORRELATION WITH THE REFERENCE IMAGE
JP3846582B2|2006-11-15|Fingerprint authentication method / program / device
JP2004118676A|2004-04-15|Fingerprint authenticating method/program/device
EP2425375B1|2014-10-29|Device for identifying a person by a print thereof
FR3065306A1|2018-10-19|METHOD OF DETECTING FRAUD
EP2652674A1|2013-10-23|Method of comparing images of irises by intelligent selection of textured zones
FR3053500B1|2019-06-28|METHOD FOR DETECTING FRAUD OF AN IRIS RECOGNITION SYSTEM
EP3073416B1|2018-03-14|Device for checking the veracity of a fingerprint
Destruel et al.2018|Color noise-based feature for splicing detection and localization
FR3040815A1|2017-03-10|METHOD OF CHARACTERIZING A MATERIAL BY TAVELURE ANALYSIS
JP4711131B2|2011-06-29|Pixel group parameter calculation method and pixel group parameter calculation apparatus
WO2021209412A1|2021-10-21|Method for detecting an attack by presentation for fingerprints
EP2901366A1|2015-08-05|Method for detecting the reality of venous networks for the purposes of identifying individuals, and biometric recognition method
EP3579141A1|2019-12-11|Method for processing a fingerprint impression image
EP3608836B1|2021-01-06|Method for obtaining a digital fingerprint image
CA2879218C|2022-03-08|Method of validation of the use of a real finger as support of a fingerprint
Yarlagadd et al.2015|A comparative study of fractal dimension based age group classification of facial images with different testing strategies
Kumar2018|Representation, Recovery and Matching of 3D Minutiae Template
EP3320481A1|2018-05-16|Method for verifying the veracity of a finger
Belguechi et al.2011|Study of the robustness of a cancelable biometric system
Czovny et al.2018|Minutia Matching using 3D Pore Clouds
Patent family:
Publication number | Publication date
KR102313794B1|2021-10-15|
FR3017230B1|2016-03-11|
EP2902943B1|2016-10-12|
IN2015DE00251A|2015-08-07|
BR102015002106A2|2015-09-08|
MX2015001312A|2016-07-08|
MX350156B|2017-08-29|
AU2015200505B2|2019-10-10|
US9471827B2|2016-10-18|
AU2015200505A1|2015-08-20|
EP2902943A1|2015-08-05|
KR20150092009A|2015-08-12|
US20150220771A1|2015-08-06|
CN104820819B|2019-10-11|
ZA201500755B|2016-01-27|
CA2879218A1|2015-08-04|
CN104820819A|2015-08-05|
ES2609050T3|2017-04-18|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
US20100066697A1|2007-03-14|2010-03-18|Axsionics Ag|Pressure measurement device and corresponding method|
EP2518684A1|2009-12-22|2012-10-31|Nec Corporation|Fake finger assessment device|
FR2981769A1|2011-10-25|2013-04-26|Morpho|ANTI-FRAUD DEVICE|
FR2774793B1|1998-02-12|2002-08-30|Bull Cp8|PROCESS FOR PRODUCING AN IMAGE BY MEANS OF A PORTABLE OBJECT, PORTABLE OBJECT AND DEVICE FOR IMPLEMENTING THE PROCESS|
JP2002222424A|2001-01-29|2002-08-09|Nec Corp|Fingerprint matching system|
US20080095412A1|2004-09-13|2008-04-24|The Ritsumeikan Trust|Method And System For Extracting Liveliness Information From Fingertip|
CN100573553C|2007-01-18|2009-12-23|Institute of Automation, Chinese Academy of Sciences|Method for detecting living body fingerprint based on thin plate spline deformation model|
GB2450479A|2007-06-22|2008-12-31|Warwick Warp Ltd|Fingerprint recognition including preprocessing an image by justification and segmentation before plotting ridge characteristics in feature space|
JP4569616B2|2007-10-04|2010-10-27|Fuji Xerox Co., Ltd.|Image processing apparatus and collation system|
CN101408935A|2008-11-27|2009-04-15|Shanghai Second Polytechnic University|Method for rapidly extracting fingerprint characteristics based on capturing effective domain|
US10445555B2|2009-01-27|2019-10-15|Sciometrics, Llc|Systems and methods for ridge-based fingerprint analysis|
WO2011152213A1|2010-06-04|2011-12-08|NEC Corporation|Fingerprint authentication system, fingerprint authentication method, and fingerprint authentication program|
JP2013535998A|2010-07-13|2013-09-19|Scott McNulty|System, method, and apparatus for sensing biometric information|
US8724861B1|2010-12-06|2014-05-13|University Of South Florida|Fingertip force, location, and orientation sensor|
JP5699845B2|2011-07-29|2015-04-15|Fujitsu Limited|Biological information processing apparatus, biological information processing method, and computer program for biological information processing|
JP5944712B2|2012-03-26|2016-07-05|Hitachi-Omron Terminal Solutions, Corp.|Vein authentication system, vein authentication apparatus and vein authentication method|
US9245165B2|2013-03-15|2016-01-26|Google Technology Holdings LLC|Auxiliary functionality control and fingerprint authentication based on a same user input|
CN103279744B|2013-05-28|2016-08-10|Institute of Automation, Chinese Academy of Sciences|Imitation fingerprint detection methods based on multiple dimensioned three mode texture feature and system|
US9424458B1|2015-02-06|2016-08-23|Hoyos Labs Ip Ltd.|Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices|
CN110326001A|2016-12-08|2019-10-11|维里迪乌姆Ip有限责任公司|The system and method for executing the user authentication based on fingerprint using the image captured using mobile device|
US10339362B2|2016-12-08|2019-07-02|Veridium Ip Limited|Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices|
US11263432B2|2015-02-06|2022-03-01|Veridium Ip Limited|Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices|
FR3034224B1|2015-03-23|2018-03-23|Morpho|DEVICE FOR VERIFYING THE VERACITY OF A DIGITAL FOOTPRINT|
US9946917B2|2016-03-31|2018-04-17|Synaptics Incorporated|Efficient determination of biometric attribute for fast rejection of enrolled templates and other applications|
CN106127172B|2016-06-29|2019-08-09|中控智慧科技股份有限公司|A kind of device and method of non-contact 3D fingerprint collecting|
US10460144B2|2016-07-20|2019-10-29|Cypress Semiconductor Corporation|Non-finger object rejection for fingerprint sensors|
US10599911B2|2016-07-20|2020-03-24|Cypress Semiconductor Corporation|Anti-spoofing protection for fingerprint controllers|
US10846501B2|2017-04-28|2020-11-24|The Board Of Trustees Of The Leland Stanford Junior University|Acoustic biometric touch scanner|
US10984085B2|2017-08-03|2021-04-20|Bio-Key International, Inc.|Biometric recognition for uncontrolled acquisition environments|
CN109074489A|2018-07-20|2018-12-21|深圳市汇顶科技股份有限公司|Method, fingerprint identification device and the electronic equipment of fingerprint recognition|
CN110896433A|2018-09-12|2020-03-20|上海耕岩智能科技有限公司|Light source driving method applied to under-screen image imaging, storage medium and electronic equipment|
TWI673655B|2018-11-13|2019-10-01|大陸商北京集創北方科技股份有限公司|Sensing image processing method for preventing fingerprint intrusion and touch device thereof|
CN109815935B|2019-02-20|2022-01-11|Oppo广东移动通信有限公司|Electronic device, fingerprint verification method and related product|
CN113240724A|2021-05-14|2021-08-10|长江存储科技有限责任公司|Thickness detection method and related product|
Legal status:
2015-02-20| PLFP| Fee payment|Year of fee payment: 2 |
2016-01-21| PLFP| Fee payment|Year of fee payment: 3 |
2017-01-24| PLFP| Fee payment|Year of fee payment: 4 |
2018-01-23| PLFP| Fee payment|Year of fee payment: 5 |
2020-01-22| PLFP| Fee payment|Year of fee payment: 7 |
2021-01-20| PLFP| Fee payment|Year of fee payment: 8 |
2022-01-19| PLFP| Fee payment|Year of fee payment: 9 |
Priority:
Application number | Filing date | Patent title
FR1450834A|2014-02-04|2014-02-04|METHOD FOR VALIDATING THE USE OF A TRUE FINGER AS A SUPPORT OF A DIGITAL FOOTPRINT|
FR1450834A|FR3017230B1|2014-02-04|2014-02-04|METHOD FOR VALIDATING THE USE OF A TRUE FINGER AS A SUPPORT OF A DIGITAL FOOTPRINT|
CA2879218A| CA2879218A1|2014-02-04|2015-01-21|Method of validation of the use of a real finger as support of a fingerprint|
US14/602,880| US9471827B2|2014-02-04|2015-01-22|Method of validation of the use of a real finger as support of a fingerprint|
EP15152049.1A| EP2902943B1|2014-02-04|2015-01-22|Method for validating the use of a real finger as a support for a fingerprint|
ES15152049.1T| ES2609050T3|2014-02-04|2015-01-22|Procedure for validating the use of an authentic finger to support a fingerprint|
MX2015001312A| MX350156B|2014-02-04|2015-01-28|Method of validation of the use of a real finger as support of a fingerprint.|
IN251DE2015| IN2015DE00251A|2014-02-04|2015-01-28|
BR102015002106A| BR102015002106A2|2014-02-04|2015-01-29|validation method of using a real finger to support a fingerprint|
KR1020150016221A| KR102313794B1|2014-02-04|2015-02-02|Method of validation of the use of a real finger as support of a fingerprint|
ZA2015/00755A| ZA201500755B|2014-02-04|2015-02-02|Method of validation of the use of a real finger as support of a fingerprint|
AU2015200505A| AU2015200505B2|2014-02-04|2015-02-03|Method of validation of the use of a real finger as support of a fingerprint|
CN201510055693.9A| CN104820819B|2014-02-04|2015-02-03|Verifying uses true finger as the verification method of the carrier of fingerprint and verifying device|